Variational Bayes via Propositionalization

Authors

  • Taisuke Sato
  • Yoshitaka Kameya
  • Kenichi Kurihara
Abstract

In this paper, we propose a unified approach to VB (variational Bayes) [1] in symbolic-statistical modeling via propositionalization. By propositionalization we mean, broadly, expressing and computing probabilistic models such as BNs (Bayesian networks) [2] and PCFGs (probabilistic context-free grammars) [3] in terms of propositional logic that treats propositional variables as binary random variables. Our proposal is motivated by three observations. The first is that propositionalization, or more precisely PPC (propositionalized probability computation), i.e. probability computation formulated in a propositional setting, has turned out to be general and efficient when variable values are sparsely interdependent. Examples include (discrete) BNs, PCFGs and, more generally, PRISM [4, 5], a probabilistic logic programming language we have been developing that computes probabilities using graphically represented AND/OR Boolean formulas. The efficacy of PPC has already been demonstrated by the Inside-Outside algorithm in the case of PCFGs and by recent PPC approaches to BNs such as the one by Chavira et al., which exploits zero probabilities and CSI (context-specific independence) [6]. Mateescu et al. introduced AND/OR search trees, a propositional representation of bucket trees, and revealed that PPC is a general computation mechanism for BNs [7]. The second observation is that while VB has been around for some time as a powerful tool for Bayesian modeling [1], its use has been restricted to relatively simple models such as BNs and HMMs (hidden Markov models) [8], even though its usefulness has been established through a variety of applications from model selection to prediction. On the other hand, it has already been shown that VB can be extended to PCFGs and implemented efficiently using dynamic programming [9]. Note that PCFG computation is just one instance of PPC, and much more general PPC is already realized in PRISM. Accordingly, if VB is combined with PRISM's PPC, we obtain VB for a class of probabilistic models far wider than BNs and PCFGs. The last observation is that deriving and implementing a VB algorithm is currently an error-prone, time-consuming process, and ensuring its correctness beyond PCFGs seems a non-trivial task. Once VB becomes available in PRISM, however, it will save considerable time and effort: we neither have to derive a new VB algorithm from scratch nor implement it. All we have to do is write a probabilistic model.
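To make the abstract's notion of PPC concrete, the following Python sketch illustrates the idea under stated assumptions; it is not PRISM's actual implementation, and all identifiers (`inside`, `vb_switch_update`, the graph encoding) are hypothetical names chosen for this example. It computes the probability of a goal by sum-product dynamic programming over a propositional AND/OR graph whose leaves are outcomes of parameterized switches, and shows a digamma-based switch update of the kind used in VB for Dirichlet-multinomial parameters, in the spirit of [9].

```python
# Illustrative sketch only: PPC as sum-product dynamic programming on an
# acyclic AND/OR graph, plus a VB-style switch update. Not PRISM source code;
# the graph encoding and all names are assumptions made for this example.
from math import exp
from scipy.special import digamma

def inside(node, graph, theta, memo=None):
    """Inside probability of `node`: OR nodes sum over alternative
    explanations, AND nodes multiply conjuncts, and 'sw' leaves look up
    the current switch parameter theta[switch][value]."""
    if memo is None:
        memo = {}
    if node in memo:
        return memo[node]
    kind = graph[node][0]
    if kind == 'sw':
        _, switch, value = graph[node]
        p = theta[switch][value]
    elif kind == 'and':
        p = 1.0
        for child in graph[node][1]:
            p *= inside(child, graph, theta, memo)
    else:  # 'or'
        p = sum(inside(child, graph, theta, memo) for child in graph[node][1])
    memo[node] = p
    return p

def vb_switch_update(alpha0, expected_counts):
    """One VB update for a Dirichlet-multinomial switch: add expected
    occurrence counts to the prior pseudo-count alpha0, then use
    exp(E_q[log theta_k]) = exp(digamma(alpha_k) - digamma(sum_j alpha_j))
    as the quasi-parameter for the next inside computation."""
    alpha = {v: alpha0 + c for v, c in expected_counts.items()}
    total = sum(alpha.values())
    return {v: exp(digamma(a) - digamma(total)) for v, a in alpha.items()}

# Tiny example: goal g has two explanations, e1 = (a=h AND b=h), e2 = (a=t).
graph = {
    'g': ('or', ['e1', 'e2']),
    'e1': ('and', ['a_h', 'b_h']),
    'e2': ('and', ['a_t']),
    'a_h': ('sw', 'a', 'h'), 'a_t': ('sw', 'a', 't'), 'b_h': ('sw', 'b', 'h'),
}
theta = {'a': {'h': 0.6, 't': 0.4}, 'b': {'h': 0.5, 't': 0.5}}
print(inside('g', graph, theta))  # 0.6 * 0.5 + 0.4 = 0.7
```

Note that the quasi-parameters returned by `vb_switch_update` are deliberately sub-normalized, as is standard in VB updates of this form. In PRISM, the AND/OR graph is derived automatically from a logic program's explanations for observed goals, which is what would let a single update of this kind cover BN-like and PCFG-like models alike.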

Similar Resources

Deep Variational Bayes Filters: Unsupervised Learning of State Space Models from Raw Data

We introduce Deep Variational Bayes Filters (DVBF), a new method for unsupervised learning and identification of latent Markovian state space models. Leveraging recent advances in Stochastic Gradient Variational Bayes, DVBF can overcome intractable inference distributions via variational inference. Thus, it can handle highly nonlinear input data with temporal and spatial dependencies such as im...

Hierarchical Variational Models (Appendix)

Relationship to empirical Bayes and RL. The augmentation with a variational prior has strong ties to empirical Bayesian methods, which use data to estimate hyperparameters of a prior distribution (Robbins, 1964; Efron & Morris, 1973). In general, empirical Bayes considers the fully Bayesian treatment of a hyperprior on the original prior (here, the variational prior on the original mean-field) and...

Functional regression via variational Bayes.

We introduce variational Bayes methods for fast approximate inference in functional regression analysis. Both the standard cross-sectional and the increasingly common longitudinal settings are treated. The methodology allows Bayesian functional regression analyses to be conducted without the computational overhead of Monte Carlo methods. Confidence intervals of the model parameters are obtained...

An Alternative View of Variational Bayes and Minimum Variational Stochastic Complexity

Bayesian learning is widely used in many applied data-modelling problems and is often accompanied by approximation schemes since it requires intractable computation of the posterior distributions. In this study, we focus on two approximation methods, the variational Bayes and the local variational approximation. We show that the variational Bayes approach for statistical models with latent...

Integrated Non-Factorized Variational Inference

We present a non-factorized variational method for full posterior inference in Bayesian hierarchical models, with the goal of capturing the posterior variable dependencies via efficient and possibly parallel computation. Our approach unifies the integrated nested Laplace approximation (INLA) under the variational framework. The proposed method is applicable in more challenging scenarios than ty...



Publication date: 2007